Simple bounds for recovering low-complexity models

Authors

  • Emmanuel J. Candès
  • Benjamin Recht
Abstract

This note presents a unified analysis of the recovery of simple objects from random linear measurements. When the linear functionals are Gaussian, we show that an s-sparse vector in R^n can be efficiently recovered from 2s log n measurements with high probability, and a rank-r, n×n matrix can be efficiently recovered from r(6n − 5r) measurements with high probability. For sparse vectors, this is within an additive factor of the best known nonasymptotic bounds. For low-rank matrices, this matches the best known bounds. We present a parallel analysis for block-sparse vectors, obtaining similarly tight bounds. In the case of sparse and block-sparse signals, we additionally demonstrate that our bounds are only slightly weakened when the measurement map is a random sign matrix. Our results are based on analyzing a particular dual point which certifies optimality conditions of the respective convex programming problem. Our calculations rely only on standard large deviation inequalities, and our analysis is self-contained.
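The sparse-vector result in the abstract can be illustrated numerically: draw a Gaussian measurement map with on the order of 2s log n rows and recover an s-sparse vector by ℓ1 minimization (the convex program the paper analyzes), here posed as a linear program. This is a minimal sketch, not the paper's code; the problem sizes, random seed, and use of `scipy.optimize.linprog` are illustrative choices.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s = 50, 3
m = int(np.ceil(2 * s * np.log(n)))  # ~ 2 s log n measurements (about 24 here)

# Ground-truth s-sparse vector in R^n
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)

# Gaussian measurement map and noiseless observations
A = rng.standard_normal((m, n))
b = A @ x_true

# l1 minimization as an LP over z = [x; t]:
#   minimize sum(t)  subject to  -t <= x <= t  and  A x = b
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.block([[I, -I], [-I, -I]])   # encodes  x - t <= 0  and  -x - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_hat = res.x[:n]
err = np.linalg.norm(x_hat - x_true)
```

With these sizes (s = 3, n = 50, m ≈ 24), the ℓ1 program typically recovers `x_true` exactly up to solver tolerance, consistent with the 2s log n measurement bound stated above.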


Similar articles

Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery

Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms of the unfoldings of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way tensor of length n and Tucker rank r from Gaussian measurements...



Learning Graphs with a Few Hubs

We consider the problem of recovering the graph structure of a “hub-networked” Ising model given i.i.d. samples, under high-dimensional settings, where number of nodes p could be potentially larger than the number of samples n. By a “hub-networked” graph, we mean a graph with a few “hub nodes” with very large degrees. State of the art estimators for Ising models have a sample complexity that sc...


Dynamic matrix recovery from incomplete observations under an exact low-rank constraint

Low-rank matrix factorizations arise in a wide variety of applications – including recommendation systems, topic models, and source separation, to name just a few. In these and many other applications, it has been widely noted that by incorporating temporal information and allowing for the possibility of time-varying models, significant improvements are possible in practice. However, despite th...


Cognitive Task Complexity and Iranian EFL Learners’ Written Linguistic Performance across Writing Proficiency Levels

Recently tasks, as the basic units of syllabi, and the cognitive complexity, as the criterion for sequencing them, have caught many second language researchers’ attention. This study sought to explore the effect of utilizing the cognitively simple and complex tasks on high- and low-proficient EFL Iranian writers’ linguistic performance, i.e., fluency, accuracy, lexical complexity, and structura...



Journal:
  • Math. Program.

Volume 141, Issue —

Pages —

Published 2013